    A finite element based formulation for sensitivity studies of piezoelectric systems

    Sensitivity analysis is a branch of numerical analysis that aims to quantify the effects that variability in the parameters of a numerical model has on the model output. A finite element based sensitivity analysis formulation for piezoelectric media is developed here and implemented to simulate the operational and sensitivity characteristics of a piezoelectric-based distributed mode actuator (DMA). The work acts as a starting point for robustness analysis of the DMA technology.
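
    As a minimal, generic illustration of this idea (not the paper's finite element formulation), the following Python sketch estimates parameter sensitivities of a toy model by central finite differences; the model and its parameter values are hypothetical stand-ins for a piezoelectric solve.

        import numpy as np

        def model(params):
            # Hypothetical stand-in for a piezoelectric finite element solve:
            # maps parameters (stiffness, coupling, permittivity) to a scalar
            # output such as actuator tip displacement.
            k, e, eps = params
            return e / (k * eps)

        def fd_sensitivities(f, p0, rel_step=1e-6):
            # Central-difference estimate of df/dp_i at the nominal point p0.
            p0 = np.asarray(p0, dtype=float)
            grads = np.empty_like(p0)
            for i in range(p0.size):
                h = rel_step * abs(p0[i]) if p0[i] != 0.0 else rel_step
                up, dn = p0.copy(), p0.copy()
                up[i] += h
                dn[i] -= h
                grads[i] = (f(up) - f(dn)) / (2.0 * h)
            return grads

        p0 = [2.0e6, 15.0, 1.2e-8]  # illustrative nominal values only
        print(fd_sensitivities(model, p0))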

    Probabilistic simulation for the certification of railway vehicles

    The present dynamic certification process, which is based on experiments, has essentially been built on experience. The introduction of simulation techniques into this process would be of great interest. However, an accurate simulation of complex, nonlinear systems is a difficult task, in particular when rare events (for example, unstable behaviour) are considered. After analysing the system and the currently used procedure, this paper proposes a method to achieve, in some particular cases, a simulation-based certification. It focuses on the need for precise and representative excitations (running conditions) and on their variable nature. A probabilistic approach is therefore proposed and illustrated with an example. First, this paper presents a short description of the vehicle/track system and of the experimental procedure. The proposed simulation process is then described. The requirement to analyse a set of running conditions at least as large as the one tested experimentally is explained. In the third section, a sensitivity analysis to determine the most influential parameters of the system is reported. Finally, the proposed method is summarized and an application is presented.
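
    A toy Monte Carlo sketch of the probabilistic idea: draw random running conditions from assumed distributions and estimate the probability that an instability criterion is exceeded. The scalar response model and all distributions below are invented for illustration; the actual certification studies rely on full nonlinear vehicle/track simulations.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed distributions of uncertain running conditions (illustrative).
        n = 100_000
        speed = rng.uniform(40.0, 80.0, n)           # vehicle speed (m/s)
        irregularity = rng.lognormal(0.0, 0.5, n)    # track irregularity index
        friction = rng.uniform(0.2, 0.4, n)          # wheel/rail friction

        # Hypothetical stand-in for a nonlinear vehicle/track simulation:
        # a scalar response compared against an instability threshold.
        response = 0.004 * speed * irregularity / friction
        unstable = response > 1.0

        print("estimated probability of unstable behaviour:", unstable.mean())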

    In silico system analysis of physiological traits determining grain yield and protein concentration for wheat as influenced by climate and crop management

    Genetic improvement of grain yield (GY) and grain protein concentration (GPC) is impeded by large genotype×environment×management interactions and by compensatory effects between traits. Here, global uncertainty and sensitivity analyses of the process-based wheat model SiriusQuality2 were conducted with the aim of identifying candidate traits to increase GY and GPC. Three contrasting European sites were selected and simulations were performed using long-term weather data and two nitrogen (N) treatments in order to quantify the effect of parameter uncertainty on GY and GPC under variable environments. The overall influence of all 75 plant parameters of SiriusQuality2 was first analysed using the Morris method. Forty-one influential parameters were identified, and their individual (first-order) and total effects on the model outputs were investigated using the extended Fourier amplitude sensitivity test. The overall effect of the parameters was dominated by their interactions with other parameters. Under high N supply, a few influential parameters with respect to GY were identified (e.g. radiation use efficiency, potential duration of grain filling, and phyllochron). However, under low N, more than 10 parameters showed similar effects on GY and GPC. All parameters had opposite effects on GY and GPC, but leaf and stem N storage capacity appeared to be good candidate traits for changing the intercept of the negative relationship between GY and GPC. This study provides a system analysis of traits determining GY and GPC under variable environments and delivers valuable information for prioritizing model development and experimental work.
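
    The Morris screening step can be reproduced in miniature with the SALib Python package. A sketch assuming a three-parameter toy model (the parameter names and bounds are hypothetical; SiriusQuality2 and its 75 parameters are not wired up here):

        import numpy as np
        from SALib.sample import morris as morris_sample
        from SALib.analyze import morris as morris_analyze

        # Hypothetical trait-like parameters of a toy yield model.
        problem = {
            "num_vars": 3,
            "names": ["rue", "grain_fill_duration", "phyllochron"],
            "bounds": [[1.0, 3.0], [500.0, 900.0], [80.0, 120.0]],
        }

        def model(x):
            rue, dur, phyl = x
            return rue * dur / phyl  # illustrative only

        X = morris_sample.sample(problem, N=100, num_levels=4)
        Y = np.array([model(x) for x in X])
        Si = morris_analyze.analyze(problem, X, Y, num_levels=4)
        for name, mu, sd in zip(problem["names"], Si["mu_star"], Si["sigma"]):
            print(f"{name}: mu* = {mu:.2f}, sigma = {sd:.2f}")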

    Derivative based global sensitivity measures

    The method of derivative based global sensitivity measures (DGSM) has recently become popular among practitioners. It has a strong link with the Morris screening method and Sobol' sensitivity indices and has several advantages over them. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is generally much lower than that for estimation of Sobol' sensitivity indices. This paper presents a survey of recent advances in DGSM concerning lower and upper bounds on the values of the Sobol' total sensitivity indices $S_i^{tot}$. Using these bounds it is possible in most cases to get a good practical estimate of the values of $S_i^{tot}$. Several examples are used to illustrate an application of DGSM.
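
    A sketch of the DGSM idea under the standard assumptions (independent inputs, each uniform on [0, 1]): estimate $\nu_i = E[(\partial f/\partial x_i)^2]$ by Monte Carlo with finite differences, then apply the known bound $S_i^{tot} \le \nu_i / (\pi^2 \, \mathrm{Var}(Y))$. The test function is a toy choice, not one of the paper's examples.

        import numpy as np

        rng = np.random.default_rng(1)

        def f(x):
            # Toy model on [0, 1]^3 (illustrative only).
            return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
                    + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

        n, d, h = 200_000, 3, 1e-5
        x = rng.random((n, d))
        y = f(x)
        var_y = y.var()

        for i in range(d):
            xp = x.copy()
            xp[:, i] += h
            dfi = (f(xp) - y) / h                  # forward-difference df/dx_i
            nu_i = np.mean(dfi ** 2)               # DGSM: nu_i = E[(df/dx_i)^2]
            bound = nu_i / (np.pi ** 2 * var_y)    # upper bound on S_i^tot
            print(f"x{i}: nu = {nu_i:.4f}, S_tot upper bound = {bound:.4f}")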

    Creating Composite Indicators with DEA and Robustness Analysis: the case of the Technology Achievement Index

    Composite indicators are regularly used for benchmarking countries’ performance, but equally often stir controversies about the unavoidable subjectivity that is connected with their construction. Data Envelopment Analysis helps to overcome some key limitations, viz., the undesirable dependence of the final results on the preliminary normalization of sub-indicators and, more cogently, on the subjective nature of the weights used for aggregation. Still, subjective decisions remain, and such modelling uncertainty propagates onto countries’ composite indicator values and relative rankings. Uncertainty and sensitivity analysis are therefore needed to assess the robustness of the final results and to analyze how much each individual source of uncertainty contributes to the output variance. The current paper reports on these issues, using the Technology Achievement Index as an illustration. Keywords: composite indicators, aggregation, weighting, Internal Market.
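
    The Data Envelopment Analysis weighting used for such composite indicators (the 'benefit of the doubt' model) reduces to one small linear program per country: choose the weights most favourable to that country, subject to no country's weighted score exceeding 1. A sketch with scipy and invented sub-indicator scores:

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical normalized sub-indicator scores
        # (rows: countries, columns: sub-indicators).
        Y = np.array([
            [0.8, 0.6, 0.9],
            [0.5, 0.9, 0.4],
            [0.7, 0.7, 0.7],
        ])

        def bod_score(Y, c):
            # max_w w.Y[c]  s.t.  w.Y[k] <= 1 for all k, w >= 0
            # (linprog minimizes, hence the sign flip on the objective).
            res = linprog(c=-Y[c], A_ub=Y, b_ub=np.ones(len(Y)),
                          bounds=[(0, None)] * Y.shape[1], method="highs")
            return -res.fun

        for c in range(len(Y)):
            print(f"country {c}: composite indicator = {bod_score(Y, c):.3f}")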

    OpenTURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance of complex systems and to meet tighter regulatory requirements (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties: transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open source software under the LGPL license, provided as a C++ library with a Python TUI, and works under Linux and Windows environments. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism for linking OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke protecting industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
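
    A minimal OpenTURNS sketch in the spirit of the river/dyke example: define input distributions, wrap a height model as a PythonFunction, and propagate the uncertainty by Monte Carlo sampling. The distributions and the simplified height relation are assumptions for illustration, not the paper's exact settings.

        import openturns as ot

        # Assumed input uncertainties (illustrative values).
        Q = ot.Normal(1000.0, 200.0)   # river flow rate (m^3/s)
        Ks = ot.Normal(30.0, 5.0)      # Strickler friction coefficient
        inputs = ot.ComposedDistribution([Q, Ks])

        def height(x):
            q, ks = x
            # Simplified wide-channel relation: H = (Q / (Ks B sqrt(s)))^0.6,
            # with assumed width B = 300 m and slope s = 0.001.
            return [(q / (ks * 300.0 * 0.001 ** 0.5)) ** 0.6]

        model = ot.PythonFunction(2, 1, height)

        # Monte Carlo propagation of the input uncertainty.
        sample_out = model(inputs.getSample(10000))
        print("mean river height (m):", sample_out.computeMean()[0])
        print("P(H > 3 m):", 1.0 - sample_out.computeEmpiricalCDF([3.0]))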

    A relative entropy rate method for path space sensitivity analysis of stationary complex stochastic dynamics

    We propose a new sensitivity analysis methodology for complex stochastic dynamics based on the Relative Entropy Rate. The method becomes computationally feasible in the stationary regime of the process and involves the calculation of suitable observables in path space for the Relative Entropy Rate and the corresponding Fisher Information Matrix. The stationary regime is crucial for stochastic dynamics, and here it allows us to address the sensitivity analysis of complex systems, including examples of processes with complex landscapes that exhibit metastability, non-reversible systems from a statistical mechanics perspective, and high-dimensional, spatially distributed models. All these systems typically exhibit non-Gaussian stationary probability distributions, while in the high-dimensional case histograms are impossible to construct directly. Our proposed methods bypass these challenges by relying on the direct Monte Carlo simulation of rigorously derived observables for the Relative Entropy Rate and Fisher Information in path space, rather than on the stationary probability distribution itself. We demonstrate the capabilities of the proposed methodology by focusing on two classes of problems: (a) Langevin particle systems with either reversible (gradient) or non-reversible (non-gradient) forcing, highlighting the ability of the method to carry out sensitivity analysis in non-equilibrium systems; and (b) spatially extended kinetic Monte Carlo models, showing that the method can handle high-dimensional problems.
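
    A small sketch of the path-space idea for a discrete-state analogue: the relative entropy rate between a Markov chain and a perturbed version of it can be estimated as the time-averaged log-likelihood ratio of the transitions observed along a single stationary trajectory, without ever constructing the stationary distribution itself. The chain below is a toy; the paper treats Langevin and kinetic Monte Carlo dynamics.

        import numpy as np

        rng = np.random.default_rng(2)

        # A small Markov chain and a perturbed version of it (illustrative).
        P = np.array([[0.8, 0.2, 0.0],
                      [0.1, 0.8, 0.1],
                      [0.0, 0.3, 0.7]])
        P_eps = P.copy()
        P_eps[0] = [0.78, 0.22, 0.0]   # perturb transitions out of state 0

        def simulate(P, n, x0=0):
            # Sample a trajectory of length n from transition matrix P.
            path = np.empty(n, dtype=int)
            x = x0
            for t in range(n):
                path[t] = x
                x = rng.choice(len(P), p=P[x])
            return path

        # Path-space estimator of the relative entropy rate H(P || P_eps):
        # average the log-likelihood ratio over observed transitions.
        path = simulate(P, 100_000)
        logratio = (np.log(P[path[:-1], path[1:]])
                    - np.log(P_eps[path[:-1], path[1:]]))
        print("estimated relative entropy rate:", logratio.mean())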